Amos Golan (American U.) and Aman Ullah (UC Riverside)
Abstract
In many distributed sensing problems, resource constraints preclude the use of all sensing assets. For example, inference in distributed sensor networks presents a fundamental trade-off between the utility of a distributed set of measurements and the resources expended to acquire them, fuse them into a model of uncertainty, and then transmit the resulting model. Active approaches seek to manage sensing resources so as to maximize a utility function while incorporating constraints on resource expenditures. Such approaches are complicated by several factors. First, the complexity of sensor planning is typically exponential in both the number of sensing actions and the planning time horizon; consequently, optimal planning methods are intractable except for very small-scale problems. Second, the choice of utility function may vary over time and across users. Approximate approaches (cf. [Zhao et al., 2002, Kreucher et al., 2005]) have been proposed that treat a subset of these issues; however, these approaches are indirect and do not scale to large problems. In this presentation, I will discuss the use of information measures for resource allocation in distributed sensing systems. Such measures are appealing because of a variety of useful properties. For example, recent results of [Nguyen et al., 2009] link a class of information measures to surrogate risk functions and their associated bounds on excess risk [Bartlett et al., 2003]; consequently, these measures are suitable proxies for a wide variety of risk functions. I will discuss a method [Williams et al., 2007a] that enables long time-horizon sensor planning in the context of state estimation with a distributed sensor network. The approach integrates the value of information discounted by resource expenditures over a rolling time horizon. Simulation results demonstrate that the resulting algorithm can provide estimation performance similar to that of greedy and myopic methods for a fraction of the resource expenditures. Furthermore, recently developed methods [Fisher III et al., 2009] have been shown to be useful for estimating these quantities in complex signal models. Finally, one consequence of this algorithmic development is a set of new fundamental performance bounds for information gathering systems [Williams et al., 2007b], which show that, under mild assumptions, optimal (though intractable) planning schemes can yield no better than twice the performance of greedy methods for certain choices of information measures. The bound can be shown to be sharp. Additional online computable bounds, often tighter in practice, are presented as well. This is joint work with Georgios Papachristoudous, Jason L. Williams, and Michael Siracusa.
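As a rough illustration of information-driven resource allocation, the sketch below greedily selects sensors by mutual-information gain per unit cost in a linear-Gaussian measurement model. It is the kind of greedy baseline the cited bounds compare against, not the rolling-horizon planner of [Williams et al., 2007a]; the function names and toy sensor set are hypothetical.

```python
# Minimal sketch: greedy sensor selection by information gain per unit cost,
# assuming a linear-Gaussian model. Illustrative only.
import numpy as np

def mutual_information(P, H, R):
    """MI between state x ~ N(0, P) and measurement y = H x + v, v ~ N(0, R)."""
    S = H @ P @ H.T + R                       # innovation covariance
    return 0.5 * (np.linalg.slogdet(S)[1] - np.linalg.slogdet(R)[1])

def greedy_select(P, sensors, budget):
    """Repeatedly pick the sensor with the largest MI gain per unit cost."""
    chosen, spent = [], 0.0
    remaining = dict(sensors)                 # name -> (H, R, cost)
    while remaining:
        def gain_rate(item):
            H, R, cost = item[1]
            return mutual_information(P, H, R) / cost
        name, (H, R, cost) = max(remaining.items(), key=gain_rate)
        if spent + cost > budget:             # stop at the budget (simplification)
            break
        # Kalman-style posterior covariance update after using this sensor
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        P = P - K @ H @ P
        chosen.append(name)
        spent += cost
        del remaining[name]
    return chosen, P

# Toy example: 2-D state, three scalar sensors with different costs.
P0 = np.eye(2)
sensors = {"a": (np.array([[1.0, 0.0]]), np.array([[0.1]]), 1.0),
           "b": (np.array([[0.0, 1.0]]), np.array([[0.1]]), 2.0),
           "c": (np.array([[1.0, 1.0]]), np.array([[0.5]]), 0.5)}
print(greedy_select(P0, sensors, budget=2.0))
```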
Ximing Wu (Texas A&M), Semiparametric Estimations with Shape Restrictions (joint with Robin Sickles, Rice U.)

Economic theories often provide useful guidance on the modeling of real-world data. For instance, a utility function associated with rational preferences is monotone, and under convex preferences it is also quasiconcave. Demand functions of normal goods are downward sloping. According to the duality theorem, profit functions are convex in output price, while cost functions are monotonically increasing and concave in input price. Researchers, when trying to model economic relationships, face two challenges: fidelity to economic theories and flexibility in functional forms. These two goals are often at odds: conformity to theories usually dictates rigid functional forms, while flexible parameterizations sometimes lead to counterintuitive predictions. One fruitful line of research tackles this problem by using nonparametric or semiparametric methods subject to restrictions suggested by economic theories. In this paper we present a flexible semiparametric estimator that incorporates shape constraints. We focus on functional relationships with two constraints, monotonicity and concavity, because this is the class of functions encountered most frequently in economic studies. Functional relationships with either one of these two constraints are special cases of our estimator, and convexity can be accommodated by a simple negation of one parameter in our model. We use an integral transformation defined by differential equations to impose the shape restrictions. A key advantage of this transformation approach is that it turns a constrained problem into an unconstrained one. We then model the unconstrained problem using penalized spline methods, resulting in a nonlinear semiparametric estimator, and we show that a careful choice of model-based penalty can simplify the estimation considerably. We propose an iterative algorithm to solve the proposed estimator and present approximate methods of inference and smoothing parameter selection. We further extend our models to multiple regressions within the framework of additive models. We illustrate the finite-sample performance and usefulness of our methods with Monte Carlo simulations and two empirical applications.
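To make the constrained-to-unconstrained device concrete, here is a minimal sketch of one standard construction of this kind (my own illustration, not the authors' estimator): an increasing, concave fit is built by double integration of exponentials of an unconstrained function, so ordinary unconstrained optimization with a roughness penalty applies.

```python
# Sketch: f(x) = b0 + int_0^x exp(c - int_0^t exp(v(u)) du) dt is increasing
# and concave for ANY unconstrained v, so fitting v needs no constraints.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 1.0, 200))
y = np.sqrt(x) + 0.05 * rng.normal(size=x.size)   # toy data: increasing, concave truth

grid = np.linspace(0.0, 1.0, 201)                 # quadrature grid on [0, 1]
du = grid[1] - grid[0]
K = 8                                             # knots for the unconstrained v

def build_f(theta):
    """Evaluate the doubly integrated representation on the grid."""
    b0, c, v = theta[0], theta[1], theta[2:]
    v_grid = np.interp(grid, np.linspace(0.0, 1.0, K), v)
    inner = np.concatenate(([0.0], np.cumsum(np.exp(v_grid[1:]) * du)))
    slope = np.exp(c - inner)    # positive and decreasing: f increasing, concave;
                                 # flipping the sign (c + inner) gives convexity
    return b0 + np.concatenate(([0.0], np.cumsum(slope[1:] * du)))

def penalized_sse(theta, lam=1e-3):
    """Least squares plus a roughness penalty on second differences of v."""
    fit = np.interp(x, grid, build_f(theta))
    return np.sum((y - fit) ** 2) + lam * np.sum(np.diff(theta[2:], 2) ** 2)

res = minimize(penalized_sse, np.zeros(2 + K), method="BFGS")
print("fitted values at x = 0 and x = 1:", build_f(res.x)[[0, -1]])
```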
Dennis Glennon (OCC), Nonparametric Estimation of Probability of Default with Varying Coefficients (joint with Bin Chen, U. Rochester, Qingqing Chen, OCC, and Yongmiao Hong, Cornell U.)

As the recent economic and financial crisis unfolded, the risks of depending on default-probability predictions from models that are not designed to be robust to changes in the economic environment became apparent in the form of historically large and unexpected credit losses. In this paper, we use a large nonprime mortgage data set made up of loans originated from 2001 to 2009 to construct a representative default model using industry-accepted practices. We use that model as a benchmark to evaluate the potential benefit of adopting an alternative model design. We show that a model developed using industry-accepted practices performs poorly post-2007, even after including macroeconomic variables to reflect the increased recognition that large changes in systematic factors are important drivers of default. An analysis of the growth in prediction errors over time suggests that one potential contributing factor is the rigidity of the fixed-coefficient design. We propose a nonparametric time-varying coefficient model to forecast the probability of default conditional on both borrower-specific and macroeconomic conditions. The idea is to estimate smooth time-varying parameters by local smoothing and compare the fitted values of the restricted constant-parameter model and the unrestricted time-varying parameter model. The method can capture a wide variety of linearities and nonlinearities without assuming any parametric form and does not require prior information about the alternative. Our preliminary results show that the nonparametric time-varying coefficient method improves model performance relative to a conventional modeling approach. We compare model performance before and after the crisis and show that conditioning on borrower-specific factors and time-varying economic and market conditions greatly improves out-of-sample forecast accuracy, especially when there are dramatic changes in the economic environment.

Hwan-sik Choi (Purdue U.), Expert Information and Nonparametric Bayesian Inference of Rare Events

Inference on rare events is difficult because historical data are scarce. Although a parametric model would allow us to use both frequent and rare events for inference on rare events, it is difficult to check whether the specification of the tail of a parametric model is correct, again because of the scarcity of tail data. I therefore propose a nonparametric model to address the concern of misspecification. Of course, a nonparametric model reduces the efficiency with which the data are used. To remedy this loss of efficiency, I use expert information as an additional source of information, combined with the data information in a Bayesian framework. Specifically, I use the Dirichlet process mixture (DPM) model and merge expert information into the DP prior. Expert knowledge is elicited as moment conditions on a finite-dimensional parameter derived from the sampling distribution of the DPM model, and the DP prior is modified to satisfy the expert's moment conditions. For this modification, the DP prior is projected onto the space of priors that comply with the expert knowledge by minimizing the Kullback-Leibler information criterion. The resulting prior distribution is given by exponentially tilting the DP prior along the expert's parameter of interest. I also discuss extensions to semiparametric models. To implement the approach, I provide a Metropolis-Hastings algorithm to sample from posterior distributions with the exponentially tilted DP prior. The proposed method combines prior information from an econometrician and an expert by finding the least-informative prior given the expert information.
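The tilting step itself is easy to illustrate. The sketch below (hypothetical names and toy numbers; it sidesteps the DP-mixture and Metropolis-Hastings machinery) projects a set of prior draws onto an expert moment condition by solving for the exponential-tilting parameter: minimizing KL divergence subject to E[g(theta)] = m reweights each draw by exp(lambda * g(theta)), with lambda pinned down by the moment condition.

```python
# Sketch of KL projection via exponential tilting, using simulated prior draws.
import numpy as np
from scipy.optimize import brentq

rng = np.random.default_rng(1)
draws = rng.lognormal(mean=0.0, sigma=1.0, size=100_000)  # stand-in prior draws
g = np.log(draws)                                         # expert's functional g(theta)
m = 0.5                                                   # expert: E[log theta] = 0.5

def tilted_mean(lam):
    """Mean of g under weights proportional to exp(lam * g)."""
    w = np.exp(lam * (g - g.max()))                       # stabilized weights
    return np.sum(g * w) / np.sum(w)

lam = brentq(lambda l: tilted_mean(l) - m, -10.0, 10.0)   # solve the moment condition
weights = np.exp(lam * (g - g.max()))
weights /= weights.sum()
print(f"lambda = {lam:.3f}, tilted E[g] = {np.sum(weights * g):.3f}")
```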
Diego Ronchetti (Columbia Business School and University of Lugano), An Empirical Study of Stock and American Option Prices

This work is an empirical study of the daily prices of U.S. stocks and American options with high trading volume. The aim is to study the stochastic dynamics of asset prices and measure the reward for risk earned by investors. The view of asset prices as expected payoffs discounted for both time and risk, and the idea that there are no arbitrage opportunities in liquid and efficient markets, are among the most solid and established economic theories. However, quantitative studies of stock and American option prices typically follow one of two major approaches. One estimates the dynamics of the determinants of asset prices nonparametrically while neglecting these theories. The other accounts for the theories but assumes arbitrary parametric models for the dynamics of asset prices. The former approach does not take advantage of useful information, while the latter introduces an important source of model risk that can lead to erroneous conclusions. The present work considers prices as discounted expected payoffs and treats stock and option markets as free of arbitrage opportunities, in accordance with the above theories; at the same time, it avoids the pitfalls of arbitrary parametric modeling through the use of semiparametric estimation techniques. The study focuses on the common stocks of the components of the Dow Jones Industrial Average traded on the New York Stock Exchange and on some American options written on them, traded in U.S. centralized markets from January 2006 to August 2008. The analysis takes asset portfolios and a nonparametric measure of aggregate volatility as proxies for the systematic determinants of prices. The joint dynamics of these proxies, stock returns, and their volatilities are estimated without any parameterization of their historical dynamics. The estimation is consistent with the theory of no-arbitrage and accounts for early exercise of the options. The study finds that the heaviest discounts of asset payoffs are those for extreme values of common volatility, followed in sequence by the momentum, market, and size factors. In addition, the paper illustrates the benefits provided by the theory of no-arbitrage for the estimation of the historical joint dynamics of risk factors, stock returns, and their volatilities.
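For reference, the two theories the study imposes can be written in standard textbook notation (generic notation, not the paper's exact semiparametric specification):

```latex
% No-arbitrage pricing with a stochastic discount factor M:
% a stock price is a discounted expected payoff,
P_t = \mathbb{E}_t\!\left[ M_{t+1}\,(P_{t+1} + D_{t+1}) \right],
% and early exercise makes an American option with payoff h(S)
% an optimal-stopping value,
V_t = \sup_{\tau \ge t} \mathbb{E}_t\!\left[ M_{t,\tau}\, h(S_\tau) \right],
\qquad M_{t,\tau} = \prod_{s=t+1}^{\tau} M_s .
```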